This repository was archived by the owner on Jul 4, 2025. It is now read-only.

Conversation

@louis-jan (Contributor) commented Jun 17, 2024

Describe Your Changes

  • This PR ships the ONNX runtime on Windows:
    • The ONNX engine is quite lightweight (14 MB), so we've opted to bundle it alongside the llama.cpp engine.
    • The post-install step downloads both engines to the cortex directory.
    • During installation, cortex-cpp moves the few files that do not belong to the engine to their proper locations.
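The relocation step described above can be sketched as follows. This is a minimal illustration only, not the actual cortex-cpp implementation: the file names, the `ENGINE_FILES` allow-list, and the function name are all hypothetical.

```python
import shutil
from pathlib import Path

# Hypothetical allow-list of files that belong to the engine itself;
# anything else unpacked alongside them gets moved out of the engine directory.
ENGINE_FILES = {"engine.dll", "onnxruntime.dll"}

def relocate_non_engine_files(engine_dir: Path, dest_dir: Path) -> list[str]:
    """Move files that do not belong to the engine into dest_dir.

    Returns the sorted names of the files that were moved.
    """
    dest_dir.mkdir(parents=True, exist_ok=True)
    moved = []
    for item in engine_dir.iterdir():
        if item.is_file() and item.name not in ENGINE_FILES:
            shutil.move(str(item), str(dest_dir / item.name))
            moved.append(item.name)
    return sorted(moved)
```

Keeping an explicit allow-list makes the move idempotent: rerunning the installer leaves already-relocated files untouched because only unknown files inside the engine directory are candidates.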

NOTE: As of this PR, the default hub is cortexhub rather than the janhq Hugging Face repo.

Fixes Issues

  • Closes #
  • Closes #

Self Checklist

  • Added relevant comments, esp in complex areas
  • Updated docs (for bug fixes / features)
  • Created issues for follow-up changes or refactoring needed

@louis-jan louis-jan force-pushed the feat/ship-onnx-engine-on-windows branch from 5c2e3a4 to 18cfb0b on June 18, 2024 06:51
@louis-jan louis-jan force-pushed the feat/ship-onnx-engine-on-windows branch from 262d836 to 31d9d71 on June 18, 2024 07:51
@louis-jan louis-jan merged commit 5f57033 into dev Jun 18, 2024
@louis-jan louis-jan deleted the feat/ship-onnx-engine-on-windows branch June 18, 2024 07:55
cahyosubroto pushed a commit that referenced this pull request on Jun 26, 2024